The Eigen-Decomposition: Eigenvalues and Eigenvectors
Author
Abstract
Eigenvectors and eigenvalues are vectors and numbers associated with square matrices; together they provide the eigen-decomposition of a matrix, which analyzes the structure of this matrix. Even though the eigen-decomposition does not exist for all square matrices, it has a particularly simple expression for a class of matrices often used in multivariate analysis, such as correlation, covariance, or cross-product matrices. The eigen-decomposition of these matrices is important in statistics because it is used to find the maximum (or minimum) of functions involving them. For example, principal component analysis is obtained from the eigen-decomposition of a covariance matrix and gives the least squares estimate of the original data matrix. Eigenvectors and eigenvalues are also referred to as characteristic vectors and latent roots or characteristic roots (in German, "eigen" means "specific of" or "characteristic of"). The set of eigenvalues of a matrix is also called its spectrum.
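The following minimal NumPy sketch (not taken from the article; the data and variable names are illustrative assumptions) shows the ideas in the abstract: it forms a covariance matrix from centered data, computes its eigen-decomposition, and keeps the leading eigenvectors to obtain the rank-k least-squares reconstruction that principal component analysis provides.

# A minimal sketch: eigen-decomposition of a covariance matrix and the
# rank-k least-squares reconstruction of centered data (PCA).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # hypothetical data: 100 observations, 5 variables
Xc = X - X.mean(axis=0)                  # center the columns

C = np.cov(Xc, rowvar=False)             # 5 x 5 covariance matrix (symmetric)
evals, evecs = np.linalg.eigh(C)         # eigh: eigen-decomposition for symmetric matrices

# eigh returns eigenvalues in ascending order; sort eigenpairs largest first.
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Check the decomposition: C = V diag(lambda) V^T
assert np.allclose(C, evecs @ np.diag(evals) @ evecs.T)

# Projecting onto the first k eigenvectors gives the rank-k least-squares
# approximation of the centered data matrix.
k = 2
V_k = evecs[:, :k]
X_hat = Xc @ V_k @ V_k.T
print("rank-2 reconstruction error:", np.linalg.norm(Xc - X_hat))

np.linalg.eigh is used rather than a general eigenvalue routine because the covariance matrix is symmetric, which guarantees real eigenvalues and orthonormal eigenvectors.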
Similar resources
Robust Estimation of Distribution by Shrinkage Technique
Most pattern recognition applications require the eigenvalues and eigenvectors of the covariance matrix. It is well known that when the number of training samples is small, the eigenvalues of the covariance matrix contain bias, and this bias degrades recognition performance. There are some methods which ignore the small eigenvalues, or acquire better estimates of the covariance matrix by correc...
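As a rough illustration of the general idea (not necessarily the estimator proposed in this paper), the sketch below shrinks the sample covariance toward a scaled identity, which pulls over-dispersed sample eigenvalues back toward their mean; the shrinkage intensity alpha is a hypothetical, user-chosen parameter.

# Hedged sketch of linear covariance shrinkage toward a scaled identity.
import numpy as np

def shrink_covariance(X, alpha=0.3):
    """Return (1 - alpha) * S + alpha * mu * I, where S is the sample covariance
    and mu is the mean of its eigenvalues. alpha is an assumed shrinkage intensity."""
    S = np.cov(X, rowvar=False)
    mu = np.trace(S) / S.shape[0]               # mean eigenvalue of S
    return (1.0 - alpha) * S + alpha * mu * np.eye(S.shape[0])

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 10))                   # few samples relative to the dimension
S = np.cov(X, rowvar=False)
S_shrunk = shrink_covariance(X)

# The shrunken eigenvalues are less spread out than the biased sample ones.
print(np.linalg.eigvalsh(S).round(2))
print(np.linalg.eigvalsh(S_shrunk).round(2))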
A Decomposition Method for Weighted Least Squares Low-rank Approximation of Symmetric Matrices
Least squares approximation of a symmetric matrix C by a symmetric positive definite matrix Ĉ of rank at most p is a classical problem. It is typically solved by computing an eigen-decomposition of C and truncating it, using only the eigenvectors associated with the min(p, q) largest positive eigenvalues of C. Here q is the number of positive eigenvalues. Thus if q...
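A minimal sketch of the truncation step described above, in the unweighted case: compute the eigen-decomposition of the symmetric matrix C and keep only the eigenvectors associated with the min(p, q) largest positive eigenvalues (the function and variable names are illustrative, not from the paper).

# Rank-at-most-p positive semidefinite approximation of a symmetric matrix
# via truncation of its eigen-decomposition.
import numpy as np

def truncated_psd_approximation(C, p):
    evals, evecs = np.linalg.eigh(C)
    order = np.argsort(evals)[::-1]             # sort eigenpairs, largest first
    evals, evecs = evals[order], evecs[:, order]
    q = int(np.sum(evals > 0))                  # number of positive eigenvalues
    k = min(p, q)
    V, d = evecs[:, :k], evals[:k]
    return V @ np.diag(d) @ V.T

# Example: a symmetric matrix that is not positive semidefinite.
C = np.array([[4.0, 2.0, 0.0],
              [2.0, 1.0, 0.0],
              [0.0, 0.0, -1.0]])
C_hat = truncated_psd_approximation(C, p=2)
print(np.linalg.eigvalsh(C_hat))                # nonnegative, rank at most 2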
Trading off Parallelism and Numerical Stability
[80] K. Veselić. A quadratically convergent Jacobi-like method for real matrices with complex conjugate eigenvalues. [82] D. Watkins and L. Elsner. Convergence of algorithms of decomposition type for the eigenvalue problem. [83] Zhonggang Zeng. Homotopy-determinant algorithm for solving matrix eigenvalue problems and its parallelizations. [69] G. Shroff. A parallel algorithm for the eigenvalues ...
On the Spectral Properties of Matrices Associated with Trend Filters
This note is concerned with the spectral properties of matrices associated with linear smoothers. We derive analytical results on the eigenvalues and eigenvectors of smoothing matrices by interpreting the latter as perturbations of matrices belonging to algebras with known spectral properties, such as the Circulant and the generalised Tau. These results are used to characterise the properties o...
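For context, the classical spectral fact underlying such perturbation arguments is that the eigenvalues of a circulant matrix are given by the discrete Fourier transform of its defining vector. The short NumPy check below is an illustrative sketch (an assumed example, not code from the paper).

# Check: for a circulant matrix C[i, j] = c[(i - j) mod n], the eigenvalues
# equal the DFT of the first column c.
import numpy as np

n = 5
c = np.array([2.0, -1.0, 0.0, 0.0, -1.0])       # symmetric circulant: second difference on a ring
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

eig_direct = np.sort(np.linalg.eigvalsh(C))     # C is symmetric, so eigvalsh applies
eig_fft = np.sort(np.fft.fft(c).real)           # DFT of c; real because c is symmetric

print(np.allclose(eig_direct, eig_fft))         # True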
Perturbation Theory for Rectangular Matrix Pencils
The theory of eigenvalues and eigenvectors of rectangular matrix pencils is complicated by the fact that arbitrarily small perturbations of the pencil can cause them to disappear. However, there are applications in which the properties of the pencil ensure the existence of eigenvalues and eigenvectors. In this paper it is shown how to develop a perturbation theory for such pencils. ...